
    Transient hydrophobic exposure in the molecular dynamics of Abeta peptide at low water concentration

    Abeta is a disordered peptide central to Alzheimer's Disease. Its aggregation has been widely explored, but the effect of molecular crowding on it much less so. The synaptic cleft where Abeta localizes holds only 60-70 water molecules along its width. We subjected Abeta40 to 100 different simulations with variable water cell size. We show that even for this disordered, aggregation-prone peptide, many properties are not cell-size dependent, i.e. a small cell is easily justified. The radius of gyration and the intra-peptide and peptide-water hydrogen bonds are well sampled on short (50 ns) time scales at any cell size. Abeta is mainly disordered with 0-30% alpha helix but undergoes consistent alpha-to-beta transitions, reaching up to 14% strand content in 5-10% of the simulations regardless of cell size. The similar prevalence in long and short simulations indicates small diffusion barriers for structural transitions, in contrast to folded globular proteins, which we suggest is a defining hallmark of intrinsically disordered proteins. Importantly, the hydrophobic surface area increases significantly in small cells (95% confidence level, two-tailed t-test), as does the variation in exposure and backbone conformations (standard deviations increased by >40% and >27%, respectively). Whereas hydrophilic exposure dominates hydrophobic exposure in large cells, this tendency breaks down at low water concentration. We interpret these findings as a concentration-dependent hydrophobic effect, with the small water layer unable to keep the protein unexposed, an effect caused mainly by the layered water-water interactions, not by the peptide dynamics. The exposure correlates with the radius of gyration ($R^2$ 0.35-0.50) and could be important in crowded environments, e.g. the synaptic cleft.
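    A minimal sketch of the kind of statistics quoted above (the two-tailed t-test on hydrophobic exposure and the exposure-versus-radius-of-gyration correlation); the per-simulation arrays are hypothetical placeholders, not the paper's data or analysis scripts:

```python
# Hedged sketch of the statistics quoted in the abstract: a two-tailed
# independent-samples t-test on hydrophobic exposure in small vs. large
# water cells, and the exposure-vs-radius-of-gyration correlation.
# All arrays below are hypothetical placeholders, not the paper's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical per-simulation mean hydrophobic SASA values (nm^2)
sasa_small = rng.normal(48.0, 4.0, size=50)   # small water cells
sasa_large = rng.normal(45.0, 3.0, size=50)   # large water cells

# Two-tailed t-test; significance at the 95% confidence level means p < 0.05
t_stat, p_value = stats.ttest_ind(sasa_small, sasa_large)
print(f"t = {t_stat:.2f}, two-tailed p = {p_value:.3g}")

# Relative increase in spread (cf. the reported >40% larger standard deviation)
print("std ratio:", sasa_small.std(ddof=1) / sasa_large.std(ddof=1))

# Correlation of exposure with radius of gyration (paper reports R^2 of 0.35-0.50)
rgyr = rng.normal(1.1, 0.1, size=50)          # hypothetical Rg values (nm)
fit = stats.linregress(rgyr, sasa_small)
print("R^2 =", fit.rvalue**2)
```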

    A study of purely astrometric selection of extragalactic point sources with Gaia

    Selection of extragalactic point sources, e.g. QSOs, is often hampered by significant selection effects, causing existing samples to have rather complex selection functions. We explore whether a purely astrometric selection of extragalactic point sources is feasible with the ongoing Gaia mission. Such a selection would be interesting as it would be unbiased in terms of the colours of the targets and hence would also allow selection of sources with colours in the stellar sequence. We have analyzed a total of 18 representative regions of the sky using GUMS, the simulator prepared for ESA's Gaia mission, in both the magnitude ranges $12 \le G \le 20$ and $12 \le G \le 18$. For each region we determine the density of apparently stationary stellar sources, i.e. sources for which Gaia cannot measure a significant proper motion. This density is contrasted with the density of extragalactic point sources in order to establish in which celestial directions a purely astrometric selection is feasible. When targeting regions at galactic latitude $|b| \ge 30^{\circ}$ the ratio of QSOs to apparently stationary stars is above 50%, and when observing towards the poles the fraction of QSOs rises to about 80%. We show that proper motions from the proposed Gaia successor mission, in about 20 years, would dramatically improve these results at all latitudes. Detection of QSOs solely from zero proper motion, unbiased by any assumptions on spectra, might lead to the discovery of new types of QSOs or new classes of extragalactic point sources.
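    A minimal sketch of what an "apparently stationary" cut could look like (keep sources whose total proper motion is consistent with zero); the 3-sigma threshold and the toy values are illustrative assumptions, not the criterion actually applied to the GUMS simulations:

```python
# Hedged sketch of a purely astrometric "apparently stationary" selection:
# keep sources whose total proper motion is statistically consistent with zero.
# The 3-sigma threshold and the toy values are illustrative assumptions, not
# the selection used with the GUMS catalogues in the paper.
import numpy as np

def stationary_mask(pmra, pmdec, pmra_err, pmdec_err, max_sigma=3.0):
    """True for sources with no significant proper motion (all in mas/yr)."""
    pm_total = np.hypot(pmra, pmdec)
    pm_err = np.sqrt(pmra_err**2 + pmdec_err**2)   # crude combined uncertainty
    return pm_total < max_sigma * pm_err

# Toy example: the first two sources look stationary, the third clearly moves.
pmra  = np.array([0.01, -0.02, 5.0])
pmdec = np.array([0.02,  0.01, 3.0])
err   = np.array([0.03,  0.03, 0.03])
print(stationary_mask(pmra, pmdec, err, err))   # [ True  True False]
```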

    Local Decoders for the 2D and 4D Toric Code

    We analyze the performance of decoders for the 2D and 4D toric code which are local by construction. The 2D decoder is a cellular automaton decoder formulated by Harrington which explicitly has a finite speed of communication and computation. For a model of independent $X$ and $Z$ errors and faulty syndrome measurements with identical probability, we report a threshold of 0.133% for this Harrington decoder. We implement a decoder for the 4D toric code which is based on a decoder by Hastings (arXiv:1312.2546). Incorporating a method for handling faulty syndromes, we estimate a threshold of 1.59% for the same noise model as in the 2D case. We compare the performance of this decoder with a decoder based on a 4D version of Toom's cellular automaton rule, as well as with the decoding method suggested by Dennis et al. (arXiv:quant-ph/0110143).
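    For illustration, a minimal sketch of the classical 2D Toom (north-east-center majority) cellular automaton rule, whose 4D version is one of the decoders compared above; this toy version only shows how a local CA rule removes isolated errors and is not the paper's decoder implementation:

```python
# Illustrative sketch of the classical 2D Toom rule (north-east-center
# majority vote) acting on a grid of bits with periodic boundaries. The
# paper compares against a 4D adaptation of this rule; this toy version
# only shows how a local CA update erases isolated errors.
import numpy as np

def toom_step(grid):
    """One synchronous NEC-majority update of a 2D binary array."""
    north = np.roll(grid, -1, axis=0)   # neighbour in one lattice direction
    east  = np.roll(grid, -1, axis=1)   # neighbour in the other direction
    return ((grid + north + east) >= 2).astype(grid.dtype)

# Toy example: a single flipped cell is erased after one update step.
g = np.zeros((8, 8), dtype=np.uint8)
g[3, 3] = 1
print(toom_step(g).sum())   # 0 -> the isolated error is gone
```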

    Experimental and theoretical investigation of fatigue life in reusable rocket thrust chambers

    During a test program to investigate low-cycle thermal fatigue, 13 rocket combustion chambers were fabricated and cyclically test-fired to failure. Six oxygen-free, high-conductivity (OFHC) copper chambers and seven Amzirc chambers were tested. The failures in the OFHC copper chambers were not typical fatigue failures but are better described as creep rupture enhanced by ratcheting: the coolant channels bulged toward the chamber centerline, resulting in progressive thinning of the wall during each cycle. The failures in the Amzirc alloy chambers were caused by low-cycle thermal fatigue. The zirconium in this alloy was not evenly distributed in the chamber material, and the life that was achieved was nominally the same as would have been predicted from OFHC copper isothermal test data.

    Survival of the cheapest: How proteome cost minimization drives evolution

    Darwin's theory of evolution emphasized that positive selection for functional proficiency provides the fitness that ultimately determines the structure of life, a view that has dominated biochemical thinking of enzymes as perfectly optimized for their specific functions. The 20th-century modern synthesis, structural biology, and the central dogma explained the machinery of evolution, and nearly neutral theory explained how selection competes with random fixation dynamics that produce molecular clocks, essential e.g. for dating evolutionary histories. However, quantitative proteomics revealed that fitness effects not related to functional proficiency play much larger roles on long evolutionary time scales than previously thought, with particular evidence that some universal biophysical selection pressures act via protein expression levels. This paper first summarizes recent progress in the 21st century towards recovering this universal selection pressure. It then argues that proteome cost minimization is the dominant underlying "non-function" selection pressure controlling most of the evolution of already functionally adapted living systems. A theory of proteome cost minimization is described and argued to have consequences for understanding evolutionary trade-offs, aging, cancer, and neurodegenerative protein-misfolding diseases.

    Crossing the Logarithmic Barrier for Dynamic Boolean Data Structure Lower Bounds

    This paper proves the first super-logarithmic lower bounds on the cell probe complexity of dynamic boolean (a.k.a. decision) data structure problems, a long-standing milestone in data structure lower bounds. We introduce a new method for proving dynamic cell probe lower bounds and use it to prove an $\tilde{\Omega}(\log^{1.5} n)$ lower bound on the operational time of a wide range of boolean data structure problems, most notably on the query time of dynamic range counting over $\mathbb{F}_2$ ([Pat07]). Proving an $\omega(\lg n)$ lower bound for this problem was explicitly posed as one of five important open problems in the late Mihai Pătrașcu's obituary [Tho13]. This result also implies the first $\omega(\lg n)$ lower bound for the classical 2D range counting problem, one of the most fundamental data structure problems in computational geometry and spatial databases. We derive similar lower bounds for boolean versions of dynamic polynomial evaluation and 2D rectangle stabbing, and for the (non-boolean) problems of range selection and range median. Our technical centerpiece is a new way of "weakly" simulating dynamic data structures using efficient one-way communication protocols with a small advantage over random guessing. This simulation involves a surprising excursion to low-degree (Chebyshev) polynomials, which may be of independent interest, and offers an entirely new algorithmic angle on the "cell sampling" method of Panigrahy et al. [PTW10].
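    To make "dynamic range counting over $\mathbb{F}_2$" concrete, here is a textbook sketch assuming the two-dimensional dominance-parity formulation: a 2D Fenwick tree over $\mathbb{F}_2$ supporting point flips and prefix-parity queries in O(log^2 n) operations per call. It only illustrates the problem being lower-bounded and has no connection to the paper's techniques:

```python
# Hedged sketch making "dynamic range counting over F_2" concrete: maintain
# an n-by-n grid of bits under point flips and dominance-parity queries
# (parity of the bits in [1..x] x [1..y]). A textbook 2D Fenwick tree over
# F_2 does both in O(log^2 n) probes; it is shown only to make the problem
# statement concrete, not to reflect the paper's lower-bound techniques.
class FenwickTree2DGF2:
    def __init__(self, n):
        self.n = n
        self.bit = [[0] * (n + 1) for _ in range(n + 1)]

    def flip(self, x, y):
        """Flip the bit (add 1 over F_2) at grid cell (x, y), 1-indexed."""
        i = x
        while i <= self.n:
            j = y
            while j <= self.n:
                self.bit[i][j] ^= 1
                j += j & -j
            i += i & -i

    def prefix_parity(self, x, y):
        """Parity of all bits in the dominance region [1..x] x [1..y]."""
        p, i = 0, x
        while i > 0:
            j = y
            while j > 0:
                p ^= self.bit[i][j]
                j -= j & -j
            i -= i & -i
        return p

# Toy usage: flip two points, then query two dominance regions.
ft = FenwickTree2DGF2(8)
ft.flip(2, 3)
ft.flip(5, 5)
print(ft.prefix_parity(4, 4))   # 1 (only the point at (2, 3) is dominated)
print(ft.prefix_parity(8, 8))   # 0 (both points -> parity 0)
```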